Nonsmooth Analysis and Parametric Optimization
Abstract
In an optimization problem that depends on parameters, an important issue is the effect that perturbations of the parameters can have on solutions to the problem and on their associated multipliers. Under quite broad conditions, the possibly multivalued mapping that gives these elements in terms of the parameters turns out to enjoy a property of "proto-differentiability." Generalized derivatives can then be calculated by solving an auxiliary optimization problem with auxiliary parameters, which is constructed from the original problem by taking second-order epi-derivatives of an essential objective function.

From an abstract point of view, a general optimization problem relative to elements $x$ of a Banach space $X$ can be seen in terms of minimizing an expression $f(x)$ over all $x \in X$, where $f$ is a function on $X$ with values in $\overline{\mathbb{R}} = \mathbb{R} \cup \{\pm\infty\}$. The effective domain $\operatorname{dom} f := \{\, x \in X \mid f(x) < \infty \,\}$ gives the "feasible" or "admissible" elements $x$. Under the assumption that $f$ is lower semicontinuous and proper (the latter meaning that $f(x) < \infty$ for at least one $x$, but $f(x) > -\infty$ for all $x$), a solution $\bar{x}$ to the problem must satisfy $0 \in \partial f(\bar{x})$, where $\partial f$ denotes subgradients in the sense of Clarke [1] (see also Rockafellar [2]). When $f$ is convex, such subgradients coincide with those of convex analysis, and the condition $0 \in \partial f(\bar{x})$ is not only necessary for optimality but also sufficient. A substantial calculus, part of a broader subject called nonsmooth analysis, has been built up for determining the set $\partial f(\bar{x})$ when $f$ has particular structure. Dual elements such as Lagrange multipliers are often involved, and under convexity assumptions these typically solve a dual problem of optimization. It has long been known that, in order to derive and interpret the dual elements appearing in optimality conditions, it is important to study optimization problems not in isolation but in parametrized form. Only recently, however, have the tools of analysis reached the stage where it is possible to analyze in a general and effective manner the dependence of...
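A minimal worked example of the optimality condition (ours, not part of the abstract): for the convex function $f(x) = |x|$ on $X = \mathbb{R}$,

\[
  \partial f(x) =
  \begin{cases}
    \{\operatorname{sign}(x)\}, & x \neq 0,\\
    [-1,1], & x = 0,
  \end{cases}
\]

so $0 \in \partial f(0)$ and, by convexity, $\bar{x} = 0$ is a global minimizer, even though $f$ is not differentiable there.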
Similar resources
Generalized Derivatives of Differential-Algebraic Equations
Nonsmooth equation-solving and optimization algorithms that require local sensitivity information are extended to systems with nonsmooth parametric differential-algebraic equations embedded. "Nonsmooth differential-algebraic equations" refers here to semi-explicit differential-algebraic equations whose algebraic equations satisfy local Lipschitz continuity and whose differential right-hand side func...
Stability Theory for Parametric Generalized Equations and Variational Inequalities via Nonsmooth Analysis
In this paper we develop a stability theory for broad classes of parametric generalized equations and variational inequalities in finite dimensions. These objects have a wide range of applications in optimization, nonlinear analysis, mathematical economics, etc. Our main concern is Lipschitzian stability of multivalued solution maps depending on parameters. We employ a new approach of nonsmooth...
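A simple one-dimensional instance of such a parametric solution map (our illustration, not taken from the paper): for

\[
  S(p) = \operatorname*{argmin}_{x \in \mathbb{R}} \Bigl\{ \tfrac12 x^2 + |x| - p\,x \Bigr\},
\]

the optimality condition $0 \in x + \partial|x| - p$ yields the soft-thresholding map $S(p) = \operatorname{sign}(p)\max\{|p| - 1, 0\}$, which satisfies $|S(p) - S(q)| \le |p - q|$. The solution map is thus single-valued and globally Lipschitz in the parameter despite the nonsmoothness of the objective, the kind of Lipschitzian stability the paper studies in far greater generality.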
Generalized Derivatives for Solutions of Parametric Ordinary Differential Equations with Non-differentiable Right-Hand Sides
Sensitivity analysis provides useful information for equation-solving, optimization, and post-optimality analysis. However, obtaining useful sensitivity information for problems with nonsmooth dynamic systems embedded is a challenging task. In this article, for any locally Lipschitz continuous mapping between finite-dimensional Euclidean spaces, Nesterov's lexicographic derivatives are shown to b...
An efficient one-layer recurrent neural network for solving a class of nonsmooth optimization problems
Constrained optimization problems have a wide range of applications in science, economics, and engineering. In this paper, a neural network model is proposed to solve a class of nonsmooth constrained optimization problems with a nonsmooth convex objective function subject to nonlinear inequality and affine equality constraints. It is a one-layer non-penalty recurrent neural network based on the...
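As a rough illustration of the problem class only (a classical projected subgradient method, not the paper's recurrent network model):

```python
import numpy as np

# Illustrative problem:  min |x1| + |x2|  subject to  x1 + x2 = 1.
# The objective is convex but nonsmooth; the constraint is affine.

def subgrad(x):
    # One valid subgradient of the l1 norm (0 chosen at kinks).
    return np.sign(x)

def project_affine(x):
    # Euclidean projection onto the affine set {x : x1 + x2 = 1}.
    return x - (x.sum() - 1.0) / 2.0

x = project_affine(np.array([3.0, -2.0]))
for k in range(1, 2001):
    # Diminishing step sizes guarantee convergence of the objective values.
    x = project_affine(x - (1.0 / k) * subgrad(x))

print(x, np.abs(x).sum())  # a point on the optimal face, value 1.0
```

The paper's one-layer recurrent network replaces such a discrete iteration with a continuous-time dynamical system whose equilibria coincide with the problem's optimal solutions.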
Sufficiency and duality for a nonsmooth vector optimization problem with generalized $\alpha$-$d_I$-type-I univexity over cones
In this paper, using Clarke's generalized directional derivative and $d_I$-invexity, we introduce new concepts of nonsmooth $K$-$\alpha$-$d_I$-invex and generalized type I univex functions over cones for a nonsmooth vector optimization problem with cone constraints. We obtain some sufficient optimality conditions and Mond-Weir type duality results under the aforesaid generalized invexity and type I cone-univexi...
Lexicographic differentiation of nonsmooth functions
We present a survey on the results related to the theory of lexicographic differentiation. This theory ensures an efficient computation of generalized (lexicographic) derivative of a nonsmooth function belonging to a special class of lexicographically smooth functions. This class is a linear space which contains all differentiable functions, all convex functions, and which is closed with respec...